Perturbation theory for Markov chains via Wasserstein distance

Authors

  • DANIEL RUDOLF
  • NIKOLAUS SCHWEIZER
Abstract

Perturbation theory for Markov chains addresses the question of how small differences in the transition probabilities of Markov chains are reflected in differences between their distributions. We prove powerful and flexible bounds on the distance of the nth step distributions of two Markov chains when one of them satisfies a Wasserstein ergodicity condition. Our work is motivated by the recent interest in approximate Markov chain Monte Carlo (MCMC) methods in the analysis of big data sets. By using an approach based on Lyapunov functions, we provide estimates for geometrically ergodic Markov chains under weak assumptions. In an autoregressive model, our bounds cannot be improved in general. We illustrate our theory by showing quantitative estimates for approximate versions of two prominent MCMC algorithms, the Metropolis-Hastings and stochastic Langevin algorithms.
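As a purely numerical illustration of the quantity such bounds control (a hypothetical setup, not the paper's construction), the sketch below simulates an exact AR(1) chain and a perturbed copy whose drift is shifted by a small eps, and estimates the 1-Wasserstein distance between their nth step distributions by Monte Carlo. The chain, the parameter values, and the scipy-based distance estimate are all assumptions made for illustration.

    # Hypothetical example: exact chain X_{k+1} = a*X_k + Z_k versus a perturbed
    # chain whose drift is shifted by eps; we estimate W1 between the two n-th
    # step distributions from samples started at the same initial point.
    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    a, eps, n, samples = 0.5, 0.05, 50, 20_000

    def run_chain(shift, n_steps, n_samples):
        x = np.zeros(n_samples)                 # both chains start at delta_0
        for _ in range(n_steps):
            x = a * x + shift + rng.standard_normal(n_samples)
        return x

    exact = run_chain(0.0, n, samples)
    perturbed = run_chain(eps, n, samples)
    # empirical proxy for W1 between the exact and perturbed n-th step laws
    print(wasserstein_distance(exact, perturbed))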

Similar articles

Approximating Markov chains and V-geometric ergodicity via weak perturbation theory

Let P be a Markov kernel on a measurable space X and let V : X → [1, +∞). This paper provides explicit connections between the V-geometric ergodicity of P and that of finite-rank nonnegative sub-Markov kernels P̂k approximating P. Special attention is paid to obtaining an efficient way to specify the convergence rate for P from that of P̂k and conversely. Furthermore, explicit bounds are obtained f...
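A rough numerical sketch of the truncation idea, under assumed choices that are not taken from the paper: a reflected birth-death chain stands in for the countable-state kernel, its leading corner blocks play the role of the finite-rank sub-Markov truncations P̂k, and eigenvalue moduli are used only as a crude proxy for convergence-rate information.

    # Assumed illustration: compare the top of the spectrum of the truncations
    # P̂_k with that of the (finite stand-in for the) full kernel P.
    import numpy as np

    def walk_kernel(size, p=0.4):
        """Reflected birth-death chain on {0, ..., size-1} with up-probability p."""
        P = np.zeros((size, size))
        for i in range(size):
            P[i, min(i + 1, size - 1)] += p
            P[i, max(i - 1, 0)] += 1 - p
        return P

    P = walk_kernel(400)                        # finite stand-in for the full kernel
    full_spec = sorted(abs(np.linalg.eigvals(P)))[-2:]
    for k in (10, 50, 200):
        P_hat = P[:k, :k]                       # finite-rank sub-Markov truncation
        trunc_spec = sorted(abs(np.linalg.eigvals(P_hat)))[-2:]
        print(k, trunc_spec, full_spec)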


Computable upper bounds on the distance to stationarity for Jovanovski and Madras’s Gibbs sampler

An upper bound on the Wasserstein distance to stationarity is developed for a class of Markov chains on R. This result, which is a generalization of Diaconis et al.’s (2009) Theorem 2.2, is applied to a Gibbs sampler Markov chain that was introduced and analyzed by Jovanovski and Madras (2014). The resulting Wasserstein bound is converted into a total variation bound (using results from Madras ...
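The following stand-in sketch (a two-block Gibbs sampler for a bivariate normal, not the Jovanovski-Madras chain) tracks an empirical Wasserstein-1 distance between the chain's first coordinate at step n and its stationary marginal, the kind of distance-to-stationarity such bounds control; the target, starting point, and parameters are illustrative assumptions.

    # Assumed example: Gibbs sampler for a bivariate normal with correlation rho,
    # alternating draws from x | y ~ N(rho*y, 1-rho^2) and y | x ~ N(rho*x, 1-rho^2).
    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(1)
    rho, chains = 0.9, 10_000
    x = np.full(chains, 5.0)                    # start far from stationarity
    target = rng.standard_normal(chains)        # draws from the stationary N(0, 1) marginal

    for n in range(1, 31):
        y = rho * x + np.sqrt(1 - rho ** 2) * rng.standard_normal(chains)
        x = rho * y + np.sqrt(1 - rho ** 2) * rng.standard_normal(chains)
        if n % 5 == 0:
            # empirical W1 between the step-n law of x and the stationary marginal
            print(n, wasserstein_distance(x, target))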


The Rate of Rényi Entropy for Irreducible Markov Chains

In this paper, we obtain the Rényi entropy rate for irreducible-aperiodic Markov chains with countable state space, using the theory of countable nonnegative matrices. We also obtain a bound on the Rényi entropy rate of an irreducible Markov chain. Finally, we show that this bound on the Rényi entropy rate is the Shannon entropy rate.
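For a finite irreducible chain, a commonly used spectral formula gives the order-α Rényi entropy rate as (1 − α)^(-1) log λ_α, where λ_α is the Perron eigenvalue of the matrix with entries p_ij^α, and the rate approaches the Shannon entropy rate as α → 1. The sketch below evaluates this on an assumed two-state example; it is only a finite-state illustration, not the countable-state analysis of the paper.

    # Assumed two-state example of the spectral formula for the Renyi entropy rate.
    import numpy as np

    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

    def renyi_rate(P, alpha):
        """(1/(1-alpha)) * log2 of the Perron eigenvalue of [p_ij ** alpha]."""
        lam = max(abs(np.linalg.eigvals(P ** alpha)))
        return np.log2(lam) / (1.0 - alpha)

    # stationary distribution and Shannon entropy rate for comparison
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi /= pi.sum()
    shannon_rate = -(pi[:, None] * P * np.log2(P)).sum()

    for alpha in (0.5, 0.99, 2.0):
        print(alpha, renyi_rate(P, alpha), shannon_rate)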


On the Mean Speed of Convergence of Empirical and Occupation Measures in Wasserstein Distance

In this work, we provide non-asymptotic bounds for the average speed of convergence of the empirical measure in the law of large numbers, in Wasserstein distance. We also consider occupation measures of ergodic Markov chains. One motivation is the approximation of a probability measure by finitely supported measures (the quantization problem). It is found that rates for empirical or occupation me...
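A quick empirical check of this phenomenon under assumed choices (a one-dimensional standard normal target and scipy's sample-based distance): the Wasserstein-1 distance between the empirical measure of n i.i.d. draws and a large reference sample shrinks as n grows, at roughly the n^(-1/2) rate expected in one dimension.

    # Assumed illustration of the decay of W1(empirical_n, target) with n.
    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(2)
    reference = rng.standard_normal(200_000)     # stands in for the target measure
    for n in (100, 1_000, 10_000):
        err = np.mean([wasserstein_distance(rng.standard_normal(n), reference)
                       for _ in range(20)])      # average over 20 empirical measures
        print(n, err)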


A Distance for HMMs Based on Aggregated Wasserstein Metric and State Registration

We propose a framework, named Aggregated Wasserstein, for computing a dissimilarity measure or distance between two Hidden Markov Models with state conditional distributions being Gaussian. For such HMMs, the marginal distribution at any time position follows a Gaussian mixture distribution, a fact exploited to softly match, aka register, the states in two HMMs. We refer to such HMMs as Gaussia...
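One building block such a framework can use (shown here as an assumed simplification, not the authors' full construction) is the closed-form 2-Wasserstein distance between two Gaussians, which can serve as the ground cost when registering the states of two Gaussian HMMs.

    # Closed-form W2 between Gaussians; the example means/covariances are assumptions.
    import numpy as np
    from scipy.linalg import sqrtm

    def gaussian_w2(mu1, cov1, mu2, cov2):
        """2-Wasserstein distance between N(mu1, cov1) and N(mu2, cov2)."""
        root = sqrtm(sqrtm(cov2) @ cov1 @ sqrtm(cov2))
        cross = np.trace(cov1 + cov2 - 2.0 * np.real(root))
        return float(np.sqrt(np.sum((mu1 - mu2) ** 2) + cross))

    mu1, cov1 = np.zeros(2), np.eye(2)
    mu2, cov2 = np.array([1.0, 0.0]), np.array([[2.0, 0.3], [0.3, 1.0]])
    print(gaussian_w2(mu1, cov1, mu2, cov2))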



Journal title:

Volume   Issue

Pages  -

Publication date: 2017